Integrating Attributional and Distributional Information in a Probabilistic Model of Meaning Representation
Authors
Abstract
In this paper we present models of how meaning is represented in the brain/mind, based upon the assumption that children develop meaning representations for words using two main sources of information: information derived from their concrete experience with objects and events in the world (which we refer to as attributional information) and information implicitly derived from exposure to language (which we refer to as distributional information). In the first part of the paper we present a model developed using self-organising maps (SOMs) starting from speaker-generated features (properties that speakers considered to be important in defining and describing the meaning of a word). This model captures meaning similarity between words based solely upon attributional information and has been shown to be successful in predicting a number of behavioural semantic effects. In the second part of the paper, we present a probabilistic model that goes beyond attributional information alone, integrating this information with distributional information derived from text corpora. The ability of this integrated model to learn semantic relationships is demonstrated with reference to comparable probabilistic models that use only attributional or distributional information.
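The attributional component described above trains a self-organising map on speaker-generated feature vectors, so that words with similar attributes come to occupy nearby map nodes. The following is a minimal sketch of that idea, not the authors' actual model: the binary feature vectors, the word set, and the 4×4 map size are invented for illustration, and the training schedule is a generic SOM recipe.

```python
# Minimal self-organising map (SOM) sketch for attributional word features.
# Hypothetical illustration only: the feature vectors and map size below are
# invented, not the speaker-generated feature norms used in the paper.
import numpy as np

def train_som(data, grid=(4, 4), epochs=50, lr0=0.5, sigma0=1.5, seed=0):
    """Train a small SOM; returns a weight grid of shape (*grid, n_features)."""
    rng = np.random.default_rng(seed)
    weights = rng.random((grid[0], grid[1], data.shape[1]))
    # Grid coordinates, used to compute neighbourhood distances on the map
    coords = np.stack(np.meshgrid(np.arange(grid[0]), np.arange(grid[1]),
                                  indexing="ij"), axis=-1)
    for t in range(epochs):
        lr = lr0 * np.exp(-t / epochs)        # decaying learning rate
        sigma = sigma0 * np.exp(-t / epochs)  # shrinking neighbourhood radius
        for x in rng.permutation(data):
            # Best-matching unit: the node whose weights are closest to x
            dists = np.linalg.norm(weights - x, axis=-1)
            bmu = np.unravel_index(np.argmin(dists), grid)
            # Gaussian neighbourhood around the BMU, applied on the grid
            d2 = np.sum((coords - np.array(bmu)) ** 2, axis=-1)
            h = np.exp(-d2 / (2 * sigma ** 2))[..., None]
            weights += lr * h * (x - weights)
    return weights

def bmu_of(weights, x):
    """Return the grid position of the best-matching unit for vector x."""
    return np.unravel_index(
        np.argmin(np.linalg.norm(weights - x, axis=-1)), weights.shape[:2])

# Toy attributional vectors: 1 = speakers listed that feature for the word.
features = ["has_fur", "barks", "has_wings", "flies", "is_edible"]
words = {"dog":     [1, 1, 0, 0, 0],
         "wolf":    [1, 1, 0, 0, 0],
         "sparrow": [0, 0, 1, 1, 0],
         "robin":   [0, 0, 1, 1, 0]}
data = np.array(list(words.values()), dtype=float)
W = train_som(data)
```

After training, words with identical or similar feature vectors (e.g. dog/wolf) map to the same or neighbouring nodes, which is the sense in which the SOM captures attributional meaning similarity.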
Similar Resources
The Role of Attributional and Distributional Information in Semantic Representation
In recent studies of semantic representation, two distinct sources of information from which we can learn word meanings have been described. We refer to these as attributional and distributional information sources. Attributional information describes the attributes or features associated with referents of words, and is acquired from our interactions with the world. Distributional information d...
Full text: Representing Meaning with a Combination of Logical and Distributional Models
NLP tasks differ in the semantic information they require, and at this time no single semantic representation fulfills all requirements. Logic-based representations characterize sentence structure, but do not capture the graded aspect of meaning. Distributional models give graded similarity ratings for words and phrases, but do not capture sentence structure in the same detail as logic-based ap...
Full text: Integrating Logical Representations with Probabilistic Information using Markov Logic
First-order logic provides a powerful and flexible mechanism for representing natural language semantics. However, it is an open question of how best to integrate it with uncertain, probabilistic knowledge, for example regarding word meaning. This paper describes the first steps of an approach to recasting first-order semantics into the probabilistic models that are part of Statistical Relation...
Full text: Representing Meaning with a Combination of Logical Form and Vectors
NLP tasks differ in the semantic information they require, and at this time no single semantic representation fulfills all requirements. Logic-based representations characterize sentence structure, but do not capture the graded aspect of meaning. Distributional models give graded similarity ratings for words and phrases, but do not capture sentence structure in the same detail as logic-based ap...
Full text: Towards a Matrix-based Distributional Model of Meaning
Vector-based distributional models of semantics have proven useful and adequate in a variety of natural language processing tasks. However, most of them lack at least one key requirement in order to serve as an adequate representation of natural language, namely sensitivity to structural information such as word order. We propose a novel approach that offers a potential of integrating order-dep...
Journal:
Volume/Issue:
Pages: -
Publication year: 2005